Generalized method of moments
In econometrics, the generalized method of moments (GMM) is a generic method for estimating parameters in statistical models. Usually it is applied in the context of semiparametric models, where the parameter of interest is finite-dimensional, whereas the full shape of the distribution function of the data may not be known, and therefore maximum likelihood estimation is not applicable.
The method requires that a certain number of ''moment conditions'' be specified for the model. These moment conditions are functions of the model parameters and the data, such that their expectation is zero at the true values of the parameters. The GMM method then minimizes a certain norm of the sample averages of the moment conditions.
The GMM estimators are known to be consistent, asymptotically normal, and efficient in the class of all estimators that do not use any extra information aside from that contained in the moment conditions.
GMM was developed by Lars Peter Hansen in 1982 as a generalization of the method of moments which was introduced by Karl Pearson in 1894. Hansen shared the 2013 Nobel Prize in Economics in part for this work.
== Description ==
Suppose the available data consist of ''T'' observations {''Y''1, ..., ''YT''}, where each observation ''Yt'' is an ''n''-dimensional multivariate random variable. We assume that the data come from a certain statistical model, defined up to an unknown parameter ''θ'' ∈ Θ. The goal of the estimation problem is to find the "true" value of this parameter, ''θ''0, or at least a reasonably close estimate.
A general assumption of GMM is that the data ''Yt'' are generated by a weakly stationary ergodic stochastic process. (The case of independent and identically distributed (iid) variables ''Yt'' is a special case of this condition.)
In order to apply GMM, we need to have "moment conditions", i.e. we need to know a vector-valued function ''g''(''Y'',''θ'') such that
:
m(\theta_0) \equiv \operatorname{E}[\,g(Y_t,\theta_0)\,] = 0,

where E denotes expectation, and ''Yt'' is a generic observation. Moreover, the function ''m''(''θ'') must differ from zero for ''θ'' ≠ ''θ''0, or otherwise the parameter ''θ'' will not be point-identified.
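As a concrete illustration (an example added here, not taken from the article itself), consider a linear instrumental-variables model ''yt'' = ''xt''′''θ'' + ''εt'' with instruments ''zt'' satisfying E[''zt''''εt''] = 0; then ''g''(''Yt'',''θ'') = ''zt''(''yt'' − ''xt''′''θ'') is a valid moment function. A minimal Python sketch of this moment function (all names here are hypothetical):
```python
import numpy as np

def g_iv(Y, theta):
    """Moment function for a linear instrumental-variables model (illustrative).

    Y is a tuple (y, X, Z): outcome vector, regressor matrix and instrument
    matrix.  Row t of the result is z_t * (y_t - x_t' theta), whose
    expectation is zero at the true parameter value.
    """
    y, X, Z = Y
    residuals = y - X @ theta                 # shape (T,)
    return Z * residuals[:, np.newaxis]       # shape (T, number of instruments)
```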
The basic idea behind GMM is to replace the theoretical expected value E[''g''(''Yt'',''θ'')] with its empirical analog, the sample average:
:
\hat{m}(\theta) \equiv \frac{1}{T}\sum_{t=1}^T g(Y_t,\theta)

and then to minimize the norm of this expression with respect to ''θ''. The minimizing value of ''θ'' is our estimate for ''θ''0.
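Continuing the illustrative sketch above, the sample analog is simply the column-wise average of the moment function over the ''T'' observations:
```python
def m_hat(Y, theta):
    """Sample analog of the moment conditions: (1/T) * sum over t of g(Y_t, theta)."""
    return g_iv(Y, theta).mean(axis=0)
```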
By the law of large numbers, \scriptstyle\hat{m}(\theta)\,\approx\;\operatorname{E}[g(Y_t,\theta)]\,=\,m(\theta) for large values of ''T'', and thus we expect that \scriptstyle\hat{m}(\theta_0)\;\approx\;m(\theta_0)\;=\;0. The generalized method of moments looks for a number \scriptstyle\hat\theta which would make \scriptstyle\hat{m}(\hat\theta) as close to zero as possible. Mathematically, this is equivalent to minimizing a certain norm of \scriptstyle\hat{m}(\theta) (the norm of ''m'', denoted ||''m''||, measures the distance between ''m'' and zero). The properties of the resulting estimator will depend on the particular choice of the norm function, and therefore the theory of GMM considers an entire family of norms, defined as
:
\| \hat{m}(\theta) \|^2_{W} = \hat{m}(\theta)'\,W\,\hat{m}(\theta),

where ''W'' is a positive-definite weighting matrix, and ''m''′ denotes transposition. In practice, the weighting matrix ''W'' is computed from the available data set; the resulting estimate is denoted \scriptstyle\hat{W}. Thus, the GMM estimator can be written as
:
\hat\theta = \operatorname{arg}\min_{\theta\in\Theta} \bigg(\frac{1}{T}\sum_{t=1}^T g(Y_t,\theta)\bigg)'\,\hat{W}\,\bigg(\frac{1}{T}\sum_{t=1}^T g(Y_t,\theta)\bigg)

Under suitable conditions this estimator is consistent and asymptotically normal, and with the right choice of weighting matrix \scriptstyle\hat{W} also asymptotically efficient.
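
Putting the pieces together, here is a minimal sketch of the one-step estimator, assuming the illustrative g_iv and m_hat defined above, an identity weighting matrix (valid but not efficient), and scipy's general-purpose minimizer rather than any dedicated GMM routine:
```python
import numpy as np
from scipy.optimize import minimize

def gmm_objective(theta, Y, W):
    """Quadratic-form GMM criterion: m_hat(theta)' W m_hat(theta)."""
    m = m_hat(Y, theta)
    return m @ W @ m

def gmm_estimate(Y, theta0, W=None):
    """One-step GMM: minimize the criterion over theta, starting from theta0."""
    if W is None:
        W = np.eye(g_iv(Y, theta0).shape[1])   # identity weighting, not efficient
    result = minimize(gmm_objective, theta0, args=(Y, W), method="BFGS")
    return result.x

# Illustrative usage on simulated data with one endogenous regressor.
rng = np.random.default_rng(0)
T = 1000
e = rng.normal(size=T)                                  # structural error
Z = rng.normal(size=(T, 1))                             # instrument
X = Z + 0.5 * e[:, None] + rng.normal(size=(T, 1))      # regressor correlated with e
y = X @ np.array([2.0]) + e                             # true theta_0 = 2
theta_hat = gmm_estimate((y, X, Z), theta0=np.zeros(1))
```
In this exactly identified example (one moment condition, one parameter) the choice of ''W'' does not affect the estimate; with more moment conditions than parameters, a second step would typically re-estimate ''W'' from the first-step residuals to obtain the asymptotically efficient estimator mentioned above.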
